Shannon Entropy Estimation in $\infty$-Alphabets from Convergence Results
Abstract
The problem of Shannon entropy estimation over countably infinite alphabets is revisited by adopting convergence results for the entropy functional. Sufficient conditions for the convergence of the entropy are used, covering scenarios with both finitely and infinitely supported distributions. From this angle, four plug-in histogram-based estimators are studied, showing strong consistency and rate-of-convergence results for distributions with finite but unknown support and for families of distributions satisfying summable tail-bound conditions.
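As a point of reference, the classical plug-in (histogram-based) entropy estimator that the abstract refers to can be sketched as below. This is a minimal illustration, not code from the paper; the function name and interface are assumptions.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (histogram / maximum-likelihood) estimate of Shannon entropy, in nats.

    The unknown distribution is replaced by the empirical frequencies of the
    observed symbols; symbols never observed contribute nothing to the sum.
    """
    n = len(samples)
    counts = Counter(samples)  # empirical histogram over the observed alphabet
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Example: samples from a fair coin; the true entropy is log(2) ≈ 0.693 nats.
samples = [0] * 5000 + [1] * 5000
print(plugin_entropy(samples))
```

For a finite-support distribution this estimator is strongly consistent; the subtlety studied in the paper is what happens when the support is infinite (or finite but unknown), where the tail of the distribution governs the rate of convergence.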
Similar references
On the Estimation of Shannon Entropy
Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are then made with Vasicek's (1976), van Es's (1992), Ebrahimi et al.'s (1994) and Correa's (1995) entropy estimators. A simulation st...
Shannon Entropy in Order Statistics and Their Concomitants from the Bivariate Normal Distribution
In this paper, we first derive some results on the Shannon entropy of order statistics and their concomitants arising from a sequence {(Xi, Yi): i = 1, 2, ...} of independent and identically distributed (iid) random variables from the bivariate normal distribution, and extend our results to a collection C(X, Y) = {(Xr1:n, Y[r1:n]), (Xr2:n, Y[r2:n]), ..., (Xrk:n, Y[rk:n])} of order statistics and th...
Universal Weak Variable-Length Source Coding on Countable Infinite Alphabets
Motivated by the fact that universal source coding on countably infinite alphabets is not feasible, this work introduces the notion of "almost lossless source coding". Analogous to the weak variable-length source coding problem studied by Han [3], almost lossless source coding relaxes the lossless block-wise assumption to allow an average per-letter distortion that vanishes asymptotical...
Shannon entropy in generalized order statistics from Pareto-type distributions
In this paper, we derive exact analytical expressions for the Shannon entropy of generalized order statistics from Pareto-type and related distributions.
Performance comparison of new nonparametric independent component analysis algorithm for different entropic indexes
Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, instead of Shannon entropy, Tsallis entropy is used and a novel ICA algorithm, which uses kernel density estimation (KDE) for estimation of source distributions, is proposed. KDE is di...